Capture
A Capture is an object that stores the raw data captured by a Sensor.
Each Capture has:
time_stamp_ : A time stamp.
A variable for storing the raw data. To be defined in derived classes.
sensor_ptr_ : A pointer to the Sensor that captured the data.
Dynamic sensor parameters. In particular, extrinsic and/or intrinsic sensor parameters, if they are dynamic (that is, time-varying), are stored in the Capture acquired by that sensor.
A list of Features extracted from the raw data.
Capture base classes
We provide two base classes for Captures, from which you can derive your own implementations: plain Captures and motion Captures. Quick descriptions follow.
CaptureBase
The constructor allows us to introduce the basic information for the
Capture. Notice that the state blocks for the dynamic sensor parameters
default to nullptr.
CaptureBase(const std::string& _type,
const TimeStamp& _ts,
SensorBasePtr _sensor_ptr = nullptr,
const TypeComposite& _state_types = {},
const VectorComposite& _state_vectors = {},
const PriorComposite& _state_priors = {});
Since CaptureBase is an abstract class, it cannot store the raw data
itself, because the type of this raw data is unknown at this level.
Possible raw data types are very diverse, e.g. images, velocities, IMU
readings, laser scans, point clouds, GNSS fixes, GNSS ranges and
navigation messages. Therefore, the raw data is stored in the classes
deriving from CaptureBase.
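To illustrate the idea, here is a minimal, self-contained sketch of a derived capture holding raw data. Note that this CaptureBase is a reduced stand-in written for this example (the real class also manages the sensor pointer, dynamic sensor state blocks, and the Feature list), and CaptureLaserScan is a hypothetical name, not a class from WOLF:

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Reduced stand-in for WOLF's CaptureBase, for illustration only.
class CaptureBase
{
  public:
    CaptureBase(std::string _type, double _time_stamp)
        : type_(std::move(_type)), time_stamp_(_time_stamp) {}
    virtual ~CaptureBase() = default;

    const std::string& getType() const { return type_; }
    double getTimeStamp() const { return time_stamp_; }

  private:
    std::string type_;       // type label of the capture
    double      time_stamp_; // time stamp of the raw data
};

// A hypothetical derived capture: the derived class is the one that
// knows the raw data type -- here, a vector of laser ranges.
class CaptureLaserScan : public CaptureBase
{
  public:
    CaptureLaserScan(double _ts, std::vector<double> _ranges)
        : CaptureBase("CaptureLaserScan", _ts), ranges_(std::move(_ranges)) {}

    const std::vector<double>& getRanges() const { return ranges_; }

  private:
    std::vector<double> ranges_; // the raw data, unknown to CaptureBase
};
```

The base class carries only the metadata common to all captures; each derived class adds the member(s) matching its sensor's raw data type.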
See also
Check the CaptureBase class code on GitLab
CaptureMotion
A base class used to integrate motion data is provided in WOLF. This
class incorporates a buffer of motion data, together with pre-integrated
motion deltas, their covariances and Jacobians. This CaptureMotion class
is used by all Processors deriving from ProcessorMotion,
which is WOLF's base class for all processors integrating motion.
CaptureMotion classes require knowledge of a few motion-related aspects:
The Capture where the pre-integration started, which we call the 'origin'.
In case of dynamic sensor parameters, the value of such parameters at the current time stamp.
We provide two constructors,
CaptureMotion(const std::string& _type,
const TimeStamp& _ts,
SensorBasePtr _sensor_ptr,
const Eigen::VectorXd& _data,
CaptureBasePtr _capture_origin_ptr,
const TypeComposite& _state_types = {},
const VectorComposite& _state_vectors = {},
const PriorComposite& _state_priors = {});
CaptureMotion(const std::string& _type,
const TimeStamp& _ts,
SensorBasePtr _sensor_ptr,
const Eigen::VectorXd& _data,
const Eigen::MatrixXd& _data_cov,
CaptureBasePtr _capture_origin_ptr,
const TypeComposite& _state_types = {},
const VectorComposite& _state_vectors = {},
const PriorComposite& _state_priors = {});
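The following self-contained sketch mirrors the shape of the second constructor above, to show what a motion capture essentially carries. Here std::vector<double> stands in for Eigen::VectorXd and Eigen::MatrixXd, and CaptureStub for CaptureBase; CaptureMotionSketch and all member names are illustrative, not WOLF's actual types:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Minimal stand-in for the origin capture (for illustration only).
struct CaptureStub
{
    double time_stamp;
};
using CaptureStubPtr = std::shared_ptr<CaptureStub>;

// Sketch of a motion capture: raw motion data, its covariance, and a
// pointer to the capture where the pre-integration started.
class CaptureMotionSketch
{
  public:
    CaptureMotionSketch(std::string         _type,
                        double              _ts,
                        std::vector<double> _data,
                        std::vector<double> _data_cov,
                        CaptureStubPtr      _capture_origin)
        : type_(std::move(_type)),
          ts_(_ts),
          data_(std::move(_data)),
          data_cov_(std::move(_data_cov)),
          capture_origin_(std::move(_capture_origin)) {}

    const std::vector<double>& getData() const { return data_; }
    CaptureStubPtr getOriginCapture() const { return capture_origin_; }

  private:
    std::string         type_;
    double              ts_;
    std::vector<double> data_;           // raw motion data
    std::vector<double> data_cov_;       // its covariance (flattened here)
    CaptureStubPtr      capture_origin_; // 'origin' of the pre-integration
};
```

The first constructor in the listing above omits the data covariance; in that case the processor typically falls back on the covariance configured in the sensor.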
See also
Check the CaptureMotion class code on GitLab
The motion buffer
The motion buffer is stored internally in CaptureMotion, and
grows with each new call to process() with new data. This is
done by the ProcessorMotion, and the
regular WOLF user should not worry too much about it.
For reference, we provide the basic structure of data constituting this buffer.
First, there is the structure Motion holding a snapshot of the motion at
a particular time,
time stamp
motion data and covariance
current motion increment, or delta, and covariance
pre-integrated delta and covariance
some useful Jacobians for later use
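The fields above can be sketched as a plain struct. This is an illustrative rendering, not WOLF's exact declaration: the real struct uses Eigen vectors and matrices, while plain containers are used here to keep the sketch self-contained:

```cpp
#include <cassert>
#include <vector>

using Vec = std::vector<double>;              // stands in for Eigen::VectorXd
using Mat = std::vector<std::vector<double>>; // stands in for Eigen::MatrixXd

// Snapshot of the motion at a particular time (illustrative field names).
struct Motion
{
    double ts_;              // time stamp

    Vec data_;               // motion data
    Mat data_cov_;           // motion data covariance

    Vec delta_;              // current motion increment, or delta
    Mat delta_cov_;          // its covariance

    Vec delta_integr_;       // pre-integrated delta
    Mat delta_integr_cov_;   // its covariance

    Mat jacobian_delta_;        // Jacobians kept for later use,
    Mat jacobian_delta_integr_; // e.g. when re-integrating the buffer
};
```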
Then, the class MotionBuffer is essentially a std::list<Motion>,
which comes with a minimal API,
class MotionBuffer : public std::list<Motion>
{
public:
MotionBuffer();
const Motion& getMotion(const TimeStamp& _ts) const;
void getMotion(const TimeStamp& _ts, Motion& _motion) const;
void split(const TimeStamp& _ts, MotionBuffer& _oldest_buffer);
void print(bool show_data = false,
bool show_delta = false,
bool show_delta_int = false,
bool show_jacs = false,
bool show_covs = false);
};
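To make the API concrete, here is a minimal self-contained implementation of the two query methods, with Motion reduced to a time stamp plus one scalar. The semantics shown (getMotion() returns the last Motion at or before the queried time stamp; split() moves that Motion and everything before it into _oldest_buffer) are our reading of the API, not a copy of WOLF's implementation:

```cpp
#include <cassert>
#include <list>

// Reduced Motion snapshot for this sketch.
struct Motion
{
    double ts_;           // time stamp
    double delta_integr_; // pre-integrated delta (scalar here)
};

class MotionBuffer : public std::list<Motion>
{
  public:
    // Return the last Motion whose time stamp is <= _ts
    // (the buffer is assumed non-empty and sorted by time stamp).
    const Motion& getMotion(double _ts) const
    {
        auto best = begin();
        for (auto it = begin(); it != end() && it->ts_ <= _ts; ++it)
            best = it;
        return *best;
    }

    // Move all Motions with time stamp <= _ts into _oldest_buffer,
    // keeping the newer ones in this buffer.
    void split(double _ts, MotionBuffer& _oldest_buffer)
    {
        auto it = begin();
        while (it != end() && it->ts_ <= _ts)
            ++it;
        _oldest_buffer.splice(_oldest_buffer.end(), *this, begin(), it);
    }
};
```

split() is what the processor needs when a keyframe falls in the middle of the buffer: the pre-integration is cut at the keyframe's time stamp, and the older half is handed over to a new capture.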
See also